This repository has been archived by the owner on Oct 25, 2024. It is now read-only.

Support huggingface popular weight format for weight-only quantization #1580

Merged
merged 8 commits into main from penghuic/support_hf_format on Jul 5, 2024

Conversation

PenghuiCheng
Contributor

Type of Change

feature
No API changes

Description

Support the Hugging Face weight-only quantization (WOQ) model format on Intel GPU.

Expected Behavior & Potential Risk

Support AutoGPTQ models from the Hugging Face model hub for WOQ inference on Intel GPU. A hedged usage sketch is shown below.
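The following is a minimal sketch of the workflow this PR targets, not code from the PR's diff: loading a GPTQ-format checkpoint from the Hugging Face Hub through the extension's `AutoModelForCausalLM` and running it on an Intel GPU (XPU). The model id, device string, and keyword arguments are illustrative assumptions.

```python
# Hypothetical usage sketch: load an AutoGPTQ checkpoint from the Hugging Face
# Hub and run weight-only quantized inference on an Intel GPU (XPU).
from transformers import AutoTokenizer
from intel_extension_for_transformers.transformers import AutoModelForCausalLM

# Hypothetical GPTQ-quantized checkpoint on the Hub.
model_id = "TheBloke/Llama-2-7B-Chat-GPTQ"

tokenizer = AutoTokenizer.from_pretrained(model_id)

# The quantization config stored in the checkpoint is expected to be detected
# so the pre-quantized weights are loaded directly rather than re-quantized.
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    device_map="xpu",       # target Intel GPU; assumed argument
    trust_remote_code=True,
)

inputs = tokenizer("What is weight-only quantization?", return_tensors="pt").to("xpu")
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```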


github-actions bot commented May 30, 2024

⚡ Required checks status: All passing 🟢

Groups summary

🟢 Format Scan Tests workflow
Check ID Status Error details
format-scan (pylint) success
format-scan (bandit) success
format-scan (cloc) success
format-scan (cpplint) success

These checks are required after the changes to intel_extension_for_transformers/transformers/llm/quantization/utils.py, intel_extension_for_transformers/transformers/modeling/modeling_auto.py.

🟢 Optimize Unit Test workflow
Check ID Status Error details
optimize-unit-test-baseline success
optimize-unit-test-PR-test success
Genreate-OptimizeUT-Report success

These checks are required after the changes to intel_extension_for_transformers/transformers/llm/quantization/utils.py, intel_extension_for_transformers/transformers/modeling/modeling_auto.py.

🟢 NeuralChat Unit Test
Check ID Status Error details
neuralchat-unit-test-baseline success
neuralchat-unit-test-PR-test success
Generate-NeuralChat-Report success

These checks are required after the changes to intel_extension_for_transformers/transformers/llm/quantization/utils.py, intel_extension_for_transformers/transformers/modeling/modeling_auto.py.

🟢 Engine Unit Test workflow
Check ID Status Error details
engine-unit-test-baseline success
engine-unit-test-PR-test success
Genreate-Engine-Report success

These checks are required after the changes to intel_extension_for_transformers/transformers/llm/quantization/utils.py, intel_extension_for_transformers/transformers/modeling/modeling_auto.py.


Thank you for your contribution! 💜

Note
This comment is automatically generated and will be updated every 180 seconds within the next 6 hours. If you have any other questions, contact VincyZhang or XuehaoSun for help.

@PenghuiCheng
Contributor Author

Depends on the IPEX GPU XeTLA kernel being ready.

@PenghuiCheng PenghuiCheng removed the WIP label Jul 1, 2024
@kevinintel kevinintel merged commit 3e85ca9 into main Jul 5, 2024
20 checks passed
@kevinintel kevinintel deleted the penghuic/support_hf_format branch July 5, 2024 09:20
@DDEle DDEle restored the penghuic/support_hf_format branch July 8, 2024 01:02
5 participants